Two-Stage Procrustes Rotation with Sparse Target Matrix and Least Squares Criterion with Regularization and Generalized Weighting
Authors
Abstract
In factor analysis, a loading matrix is often rotated toward a simple target matrix for its simplicity. For this purpose, Procrustes rotation minimizes the discrepancy between the target and the rotated loadings using two types of approximation: 1) approximation of the zeros in the target by non-zero loadings, and 2) approximation of the non-zeros in the target by the loadings. The central issue considered in this article is that Procrustes rotation treats the two types of approximation equally, although the former is more important for simplifying the loading matrix. Furthermore, the well-known Simplimax rotation suffers from computational inefficiency in estimating the sparse target matrix, which yields a considerable number of local minima. This research proposes a new procedure that consists of the following two stages. The first stage estimates the sparse target matrix at a lesser computational cost by means of a regularization technique. In the second stage, the loading matrix is rotated toward the target, emphasizing the approximation of the target's zeros, using a least squares criterion with the generalized weighting newly proposed in this study. A simulation study and real data examples revealed that the proposed method surely simplifies loading matrices.
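As a rough illustration of the kind of two-stage procedure described above, the following Python sketch first builds a sparse target from a loading matrix and then rotates the loadings toward that target under a weighted least squares criterion that up-weights the zero cells. The thresholding rule, the weights w_zero and w_nonzero, and the projected-gradient solver are assumptions made for illustration only, not the algorithm proposed in the paper.

```python
import numpy as np

def sparse_target(A, threshold=0.3):
    """Stage 1 (illustrative): build a sparse target by zeroing small loadings.
    The paper uses a regularization technique; hard-thresholding with an
    assumed cutoff is only a stand-in here."""
    return np.where(np.abs(A) >= threshold, A, 0.0)

def weighted_procrustes(A, T, w_zero=4.0, w_nonzero=1.0, n_iter=1000):
    """Stage 2 (illustrative): orthogonally rotate A toward T by minimizing
    sum_ij w_ij (A R - T)_ij^2, where the zero cells of T receive the larger
    weight w_zero.  Solved by projected gradient on the set of rotation
    matrices (an assumed solver, not the paper's algorithm)."""
    W = np.where(T == 0.0, w_zero, w_nonzero)
    R = np.eye(A.shape[1])
    # conservative step size based on a Lipschitz-type bound for the gradient
    step = 1.0 / (2.0 * W.max() * np.linalg.norm(A, 2) ** 2)
    for _ in range(n_iter):
        G = 2.0 * A.T @ (W * (A @ R - T))       # gradient of the criterion
        U, _, Vt = np.linalg.svd(R - step * G)  # project back onto orthogonal matrices
        R = U @ Vt
    return A @ R, R

# toy usage with an assumed 12 x 3 loading matrix
rng = np.random.default_rng(0)
A = rng.normal(size=(12, 3))
L_rot, R = weighted_procrustes(A, sparse_target(A))
```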
Similar Articles
Sparse regularization for least-squares AVP migration
This paper presents least-squares wave equation AVP (Amplitude versus Ray Parameter) migration with non-quadratic regularization. We pose migration as an inverse problem and propose a cost function that makes use of a priori information about the AVP common image gather. In particular, we introduce two regularization goals: smoothness along the ray parameter direction and sparseness in the dept...
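To give a concrete picture of a cost function with both smoothness and sparseness regularization, the sketch below minimizes a generic data-misfit term plus a quadratic smoothness penalty and an L1 sparseness penalty using ISTA. The operators G and D and the penalty weights are placeholders for illustration, not the migration operators of the paper.

```python
import numpy as np

def soft(x, tau):
    """Soft-thresholding, the proximal operator of the L1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def sparse_smooth_inversion(G, d, D, mu=0.1, lam=0.1, n_iter=200):
    """Minimize ||G m - d||^2 + mu ||D m||^2 + lam ||m||_1 by ISTA.
    G is a generic linear modeling operator and D a difference (smoothing)
    operator; both are assumptions standing in for the paper's operators."""
    m = np.zeros(G.shape[1])
    H = 2.0 * (G.T @ G + mu * D.T @ D)
    step = 1.0 / np.linalg.norm(H, 2)   # spectral norm gives a convergent step
    for _ in range(n_iter):
        grad = 2.0 * (G.T @ (G @ m - d) + mu * D.T @ (D @ m))
        m = soft(m - step * grad, step * lam)
    return m
```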
The Past Hospitalization and Its Association with Suicide Attempts and Ideation in Patients with MDD and Comparison with BMD (Depressed Type) Group
No abstract available.
Least Squares Optimization with L1-Norm Regularization
This project surveys and examines optimization approaches proposed for parameter estimation in Least Squares linear regression models with an L1 penalty on the regression coefficients. We first review linear regression and regularization, and both motivate and formalize this problem. We then give a detailed analysis of 8 of the varied approaches that have been proposed for optimizing this objec...
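As a concrete reference point for the objective being optimized, the sketch below implements plain cyclic coordinate descent with soft-thresholding for (1/2)||y − Xb||² + λ||b||₁. It is offered only to illustrate the problem; the objective scaling and fixed iteration count are assumptions, and no claim is made that this is one of the surveyed approaches.

```python
import numpy as np

def lasso_cd(X, y, lam=0.1, n_iter=100):
    """Cyclic coordinate descent for (1/2)||y - X b||^2 + lam ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r_j = y - X @ b + X[:, j] * b[j]          # partial residual excluding j
            rho = X[:, j] @ r_j
            # soft-threshold the univariate least squares solution
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / (col_sq[j] + 1e-12)
    return b
```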
Nonlinear least squares and regularization
I present and discuss some general ideas about iterative nonlinear output least-squares methods. The main result is that, if it is possible to do forward modeling on a physical problem in a way that permits the output (i.e., the predicted values of some physical parameter that could be measured) and the first derivative of the same output with respect to the model parameters (whatever they may be...
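The idea of using forward-model outputs and their first derivatives in an iterative, regularized output least-squares scheme can be made concrete with a small sketch: a Tikhonov penalty is stacked onto the residual vector so that a standard Gauss-Newton-type solver handles both terms. The solver choice (scipy.optimize.least_squares) and the penalty form are assumptions for illustration, not the scheme discussed in the paper.

```python
import numpy as np
from scipy.optimize import least_squares

def regularized_fit(forward, jac, data, m0, lam=1e-2):
    """Minimize ||forward(m) - data||^2 + lam ||m - m0||^2 by stacking the
    regularization term onto the residual vector, so a Gauss-Newton-type
    solver can use the forward model and its Jacobian directly."""
    sqrt_lam = np.sqrt(lam)

    def residuals(m):
        return np.concatenate([forward(m) - data, sqrt_lam * (m - m0)])

    def jacobian(m):
        return np.vstack([jac(m), sqrt_lam * np.eye(m.size)])

    return least_squares(residuals, m0, jac=jacobian)

# toy usage: fit an exponential decay y = m[0] * exp(-m[1] * t)
t = np.linspace(0.0, 4.0, 50)
y = 2.0 * np.exp(-1.3 * t)
fwd = lambda m: m[0] * np.exp(-m[1] * t)
J = lambda m: np.column_stack([np.exp(-m[1] * t), -m[0] * t * np.exp(-m[1] * t)])
result = regularized_fit(fwd, J, y, m0=np.array([1.0, 1.0]))
```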
A Generalized Least Squares Matrix Decomposition
Variables in high-dimensional data sets common in neuroimaging, spatial statistics, time series and genomics often exhibit complex dependencies that can arise, for example, from spatial and/or temporal processes or latent network structures. Conventional multivariate analysis techniques often ignore these relationships. We propose a generalization of the singular value decomposition that is app...
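One common way to realize such a weighted generalization of the SVD, assuming the row and column dependencies are encoded by positive-definite weight matrices Q and R, is to whiten the data with matrix square roots, take an ordinary SVD, and map back. The sketch below is this textbook construction, offered for intuition rather than as the specific estimation procedure of the paper.

```python
import numpy as np
from scipy.linalg import sqrtm

def generalized_svd_decomposition(X, Q, R, k=2):
    """Rank-k decomposition X ~ U diag(s) V^T with U^T Q U = I and
    V^T R V = I, where Q and R are assumed positive-definite row/column
    weight matrices encoding dependencies (e.g. spatial or temporal)."""
    Qh = np.real(sqrtm(Q))                      # symmetric square roots
    Rh = np.real(sqrtm(R))
    U_t, s, Vt_t = np.linalg.svd(Qh @ X @ Rh, full_matrices=False)
    U = np.linalg.solve(Qh, U_t[:, :k])         # map back to the original metric
    V = np.linalg.solve(Rh, Vt_t[:k].T)
    return U, s[:k], V

# toy usage: with identity weights this reduces to the ordinary truncated SVD
X = np.random.default_rng(1).normal(size=(20, 8))
U, s, V = generalized_svd_decomposition(X, np.eye(20), np.eye(8), k=2)
```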
Journal
Journal title: Open Journal of Statistics
Year: 2023
ISSN: 2161-7198, 2161-718X
DOI: https://doi.org/10.4236/ojs.2023.132014